OnGoal: Tracking and Visualizing Conversational Goals in Multi-Turn Dialogue with Large Language Models

Coscia, Adam, Guo, Shunan, Koh, Eunyee, Endert, Alex

arXiv.org Artificial Intelligence

As multi-turn dialogues with large language models (LLMs) grow longer and more complex, how can users better evaluate and review progress on their conversational goals? We present OnGoal, an LLM chat interface that helps users better manage goal progress. OnGoal provides real-time feedback on goal alignment through LLM-assisted evaluation, explanations for evaluation results with examples, and overviews of goal progression over time, enabling users to navigate complex dialogues more effectively. Through a study with 20 participants on a writing task, we evaluate OnGoal against a baseline chat interface without goal tracking. Using OnGoal, participants spent less time and effort to achieve their goals while exploring new prompting strategies to overcome miscommunication, suggesting tracking and visualizing goals can enhance engagement and resilience in LLM dialogues. Our findings inspired design implications for future LLM chat interfaces that improve goal communication, reduce cognitive load, enhance interactivity, and enable feedback to improve LLM performance.
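The core loop the abstract describes, evaluating each tracked goal against every new reply and keeping a per-goal history, can be sketched as below. This is a minimal illustration, not OnGoal's implementation; the class names, the verdict labels, and the keyword-based judge standing in for the LLM evaluator are all invented:

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    description: str
    history: list = field(default_factory=list)  # one verdict per dialogue turn

class GoalTracker:
    """Track how well each conversational goal is met, turn by turn."""

    def __init__(self, judge):
        # judge: callable (goal_text, reply) -> verdict string, e.g. "met"/"unmet"
        self.judge = judge
        self.goals = []

    def add_goal(self, description):
        self.goals.append(Goal(description))

    def record_turn(self, reply):
        # Evaluate every tracked goal against the latest LLM reply.
        for goal in self.goals:
            goal.history.append(self.judge(goal.description, reply))

    def progression(self, goal_index):
        # Overview of how one goal evolved over the dialogue.
        return list(self.goals[goal_index].history)

# Trivial keyword judge standing in for an LLM-assisted evaluation call.
def keyword_judge(goal, reply):
    return "met" if goal.lower() in reply.lower() else "unmet"

tracker = GoalTracker(keyword_judge)
tracker.add_goal("mention a deadline")
tracker.record_turn("Here is a draft of the email.")
tracker.record_turn("I added this: please mention a deadline of Friday.")
print(tracker.progression(0))  # → ['unmet', 'met']
```

A real system would replace `keyword_judge` with a prompt to an LLM and attach explanations and examples to each verdict, as the paper describes.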


SMARTAPS: Tool-augmented LLMs for Operations Management

Yu, Timothy Tin Long, Mostajabdaveh, Mahdi, Byusa, Jabo Serge, Ramamonjison, Rindra, Carenini, Giuseppe, Mao, Kun, Zhou, Zirui, Zhang, Yong

arXiv.org Artificial Intelligence

Large language models (LLMs) present intriguing opportunities to enhance user interaction with traditional algorithms and tools in real-world applications. An advanced planning system (APS) is sophisticated software that leverages optimization to help operations planners create, interpret, and modify an operational plan. While highly beneficial, many customers are priced out of using an APS due to the ongoing costs of consultants responsible for customization and maintenance. To address the need for a more accessible APS expressed by supply chain planners, we present SmartAPS, a conversational system built on a tool-augmented LLM. Our system provides operations planners with an intuitive natural language chat interface, allowing them to query information, perform counterfactual reasoning, receive recommendations, and execute scenario analysis to better manage their operations. A short video demonstrating the system has been released: https://youtu.be/KtIrJjlDbyw
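To make "tool-augmented LLM" concrete, here is a minimal sketch of the dispatch step such a system needs once the model has emitted a structured tool call. The tool names, the data, and the call format are invented for illustration and are not SmartAPS's actual API:

```python
# Hypothetical tool registry a tool-augmented LLM could route calls to.
def query_inventory(item):
    stock = {"widgets": 120, "gears": 40}  # made-up inventory data
    return f"{item}: {stock.get(item, 0)} units on hand"

def reschedule_order(order_id, day):
    return f"order {order_id} moved to {day}"

TOOLS = {"query_inventory": query_inventory, "reschedule_order": reschedule_order}

def dispatch(tool_call):
    """Route a parsed tool call, e.g. {'tool': ..., 'args': {...}}, to a function."""
    fn = TOOLS[tool_call["tool"]]
    return fn(**tool_call["args"])

print(dispatch({"tool": "query_inventory", "args": {"item": "widgets"}}))
# → widgets: 120 units on hand
```

In a full system the LLM produces the `tool_call` dict from the planner's natural-language request, and the returned string is fed back into the conversation.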


Enabling Rapid Shared Human-AI Mental Model Alignment via the After-Action Review

Gu, Edward, Siu, Ho Chit, Platt, Melanie, Hurley, Isabelle, Peña, Jaime, Paleja, Rohan

arXiv.org Artificial Intelligence

In this work, we present two novel contributions toward improving research in human-machine teaming (HMT): 1) a Minecraft testbed to accelerate testing and deployment of collaborative AI agents and 2) a tool that allows users to revisit and analyze behaviors within an HMT episode to facilitate shared mental model development. Our browser-based Minecraft testbed allows for rapid testing of collaborative agents in a continuous-space, real-time, partially observable environment with real humans, without the cumbersome setup typical of human-AI interaction user studies. As Minecraft has an extensive player base and a rich ecosystem of pre-built AI agents, we hope this contribution can help accelerate research in the design of new collaborative agents and in understanding different human factors within HMT. Our mental model alignment tool facilitates user-led post-mission analysis by including replayable video displays of the first-person perspectives of the team members (i.e., the human and AI), and a chat interface that leverages GPT-4 to provide answers to various queries regarding the AI's experiences and model details.


LiteWebAgent: The Open-Source Suite for VLM-Based Web-Agent Applications

Zhang, Danqing, Rama, Balaji, Ni, Jingyi, He, Shiying, Zhao, Fu, Chen, Kunyu, Chen, Arnold, Cao, Junyu

arXiv.org Artificial Intelligence

We introduce LiteWebAgent, an open-source suite for VLM-based web agent applications. Our framework addresses a critical gap in the web agent ecosystem with a production-ready solution that combines minimal serverless backend configuration, intuitive user and browser interfaces, and extensible research capabilities in agent planning, memory, and tree search. For the core LiteWebAgent agent framework, we implemented a simple yet effective baseline using recursive function calling, with decoupled action generation and action grounding. In addition, we integrate advanced research components such as agent planning, agent workflow memory, and tree search in a modular and extensible manner. We then integrate the LiteWebAgent agent framework with frontend and backend as deployed systems in two formats: (1) a production Vercel-based web application, which provides users with an agent-controlled remote browser, and (2) a Chrome extension leveraging LiteWebAgent's API to control an existing Chrome browser via CDP (Chrome DevTools Protocol). The LiteWebAgent framework is available at https://github.com/PathOnAI/LiteWebAgent, with the deployed frontend at https://lite-web-agent.vercel.app/.
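The decoupling the abstract mentions, separating action generation (deciding what to do in natural terms) from action grounding (resolving that description to a concrete page element), can be illustrated roughly as follows. The page model, selectors, and function names are invented, and a stub stands in for the VLM:

```python
# Toy model of the current page: described targets mapped to CSS selectors.
PAGE_ELEMENTS = {
    "search box": "#q",
    "search button": "button[type=submit]",
}

def generate_action(task_step):
    # Stand-in for the VLM proposing a natural-language action for this step.
    if "search" in task_step:
        return {"verb": "click", "target": "search button"}
    return None  # no action proposed for this step

def ground_action(action):
    # Grounding: resolve the described target to a concrete selector
    # that a browser controller (e.g. via CDP) could act on.
    selector = PAGE_ELEMENTS[action["target"]]
    return {"verb": action["verb"], "selector": selector}

step = generate_action("submit the search")
print(ground_action(step))
# → {'verb': 'click', 'selector': 'button[type=submit]'}
```

The benefit of the split is that the generator can be improved (or swapped for planning or tree search) without touching the grounding layer, and vice versa.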


Remote Life Support Robot Interface System for Global Task Planning and Local Action Expansion Using Foundation Models

Obinata, Yoshiki, Jia, Haoyu, Kawaharazuka, Kento, Kanazawa, Naoaki, Okada, Kei

arXiv.org Artificial Intelligence

Robot systems capable of executing tasks based on language instructions have been actively researched. It is challenging, however, to convey in a single language instruction information that is uncertain and can only be determined on-site. In this study, we propose a system that includes the ambiguous parts of language instructions as template variables, communicating to the robot which information to collect and which options to present to the user for predictable uncertain events. The system implements prompt generation for each robot action function based on template variables to collect information, and a feedback system that presents and lets the user select options based on template variables for user-to-robot communication. The effectiveness of the proposed system was demonstrated through its application to real-life support tasks performed by the robot.
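The template-variable idea can be illustrated with a small sketch: ambiguous slots in the instruction are left as named variables and are either filled from information collected on-site or flagged as options to put to the user. The variable names and resolution logic here are invented, not the paper's implementation:

```python
import re

# An instruction with ambiguous parts left as template variables
# (the slot names {drink} and {location} are illustrative).
instruction = "fetch a {drink} from the {location}"

def resolve(template, observed):
    # Fill each template variable from information collected on-site;
    # anything still unknown becomes an option to present to the user.
    def fill(match):
        var = match.group(1)
        return observed.get(var, f"<ask user: {var}>")
    return re.sub(r"\{(\w+)\}", fill, template)

print(resolve(instruction, {"location": "kitchen"}))
# → fetch a <ask user: drink> from the kitchen
```

In the paper's setting, the filled-in instruction would then drive prompt generation for each robot action function, while the unresolved slots feed the user-feedback loop.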


Microsoft is simplifying Teams chat into a single unified interface

PCWorld

Microsoft is redesigning its Teams chat interface -- in a way that mimics the organization of, say, PCWorld's home page. Microsoft said Monday that it will begin testing a new Teams interface in November, optionally collapsing various channels, teams, and chat options into a single feed. Customers with access to the Teams public preview will be able to try out this new interface next month, Microsoft said today. On the surface, the new chat interface appears similar to how PCWorld organizes our latest articles. What we call a "crawl" of articles progresses down the PCWorld home page, mixing in news, tips, how-tos, reviews, and more.


NESTLE: a No-Code Tool for Statistical Analysis of Legal Corpus

Cho, Kyoungyeon, Han, Seungkum, Choi, Young Rok, Hwang, Wonseok

arXiv.org Artificial Intelligence

The statistical analysis of large-scale legal corpora can provide valuable legal insights. For such analysis one needs to (1) select a subset of the corpus using document retrieval tools, (2) structure text using information extraction (IE) systems, and (3) visualize the data for the statistical analysis. Each process demands either specialized tools or programming skills, whereas no comprehensive unified "no-code" tools have been available. Here we provide NESTLE, a no-code tool for large-scale statistical analysis of legal corpora. Powered by a Large Language Model (LLM) and an internal custom end-to-end IE system, NESTLE can extract any type of information that has not been predefined in the IE system, opening up the possibility of unlimited customizable statistical analysis of the corpus without writing a single line of code. We validate our system on 15 Korean precedent IE tasks and 3 legal text classification tasks from LexGLUE. The comprehensive experiments reveal that NESTLE can achieve GPT-4-comparable performance by training the internal IE module with 4 human-labeled and 192 LLM-labeled examples.
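The three-step workflow the abstract lists (retrieve a subset, extract structure, aggregate for statistics) can be sketched as below. The toy corpus, field, and function names are invented, and a simple string split stands in for the LLM/IE extraction step:

```python
# Toy corpus of case summaries (invented data).
corpus = [
    "2019 contract dispute, damages awarded: 5000",
    "2021 contract dispute, damages awarded: 9000",
    "2021 tax appeal, dismissed",
]

def retrieve(docs, keyword):
    # Step 1: select the subset of the corpus relevant to the analysis.
    return [d for d in docs if keyword in d]

def extract_damages(doc):
    # Step 2: structure free text into a field; a stand-in for the IE system.
    marker = "damages awarded: "
    return int(doc.split(marker)[1]) if marker in doc else None

# Step 3: aggregate the structured field for a statistic.
selected = retrieve(corpus, "contract dispute")
amounts = [extract_damages(d) for d in selected]
print(sum(amounts) / len(amounts))  # → 7000.0
```

NESTLE's point is that step 2 is not limited to predefined fields: the LLM can label a handful of examples for any new field, which then train the internal IE module.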


Introducing `askgpt`: a chat interface that helps you to learn R!

#artificialintelligence

Everyone is talking about AI at the moment. So when I talked to my colleagues Mariken and Kasper the other day about how to make teaching R more engaging and how to help students overcome their problems, it is no big surprise that the conversation eventually found its way to the large language model GPT-3.5 by OpenAI and the chat interface ChatGPT. Its advantages for learning R (or any programming language) are rather obvious: you get help on exactly your path to learning, which is different for every one of us; you can ask the model anything without anxiety about what it might think of you; and it can answer instantaneously. So I got to work implementing a few of the functionalities I wish I had available when I first started with R. The resulting package was just released on CRAN and I wanted to write this post to highlight a few of the ways you can use it to make learning or teaching easier. You can install it now like so:

```r
install.packages("askgpt")
```

Or get the development version:

```r
remotes::install_github("JBGruber/askgpt")
```

A Simple Chat Interface Directly in R

The main function, askgpt(), is very similar to ChatGPT, only directly in R:

````r
library(askgpt)
askgpt("Can you explain how functions work in R?")
#> Functions in R are a set of pre-defined or user-defined instructions that can
#> take inputs, perform specific calculations or operations, and return outputs.
#> Functions can be used to automate repetitive tasks, combine multiple
#> operations into a single step, and create more complex programs.
#>
#> In R, function definitions can be created using the `function()` keyword. The
#> basic syntax of a function definition in R is as follows:
#>
#> ```
#> function_name <- function(argument1, argument2, ...) {
#>   # Function code goes here
#>   return(output)
#> }
#> ```
#>
#> Here, `function_name` is the name you choose for your function, `argument1`,
#> `argument2`, etc. are the inputs to the function (also called parameters), and
#> `output` is the value that the function returns.
#>
#> To use a function in R, you simply call it by its name and supply any
#> necessary arguments:
#>
#> ```
#> function_name(argument1, argument2, ...)
#> ```
#>
#> R has a large number of built-in functions that perform a wide variety of
#> tasks. For example, the `sum()` function adds up all the values of a given
#> vector, while the `mean()` function calculates the average.
#>
#> In addition to using pre-defined functions in R, you can also create your own
#> custom functions based on your specific needs. By combining multiple functions
#> and operations in a single custom function, you can create powerful tools for
#> data analysis and modeling.

askgpt("How do you make a histogram with ggplot2?")
#> To make a histogram with ggplot2, follow these steps:
#>
#> 1. Load the ggplot2 library using the `library()` function.
#>
#> ```
#> library(ggplot2)
#> ```
#>
#> 2. Prepare your data. Create a vector or data frame that contains the values
#> you want to plot.
#>
#> 3. Create a ggplot object using the `ggplot()` function. Pass in the name of
#> the data frame as an argument.
#>
#> ```
#> ggplot(data = your_data_frame)
#> ```
#>
#> 4. Add a histogram layer to the plot using the `geom_histogram()` function.
#> Pass in the name of the column you want to use for your histogram as the
#> `mapping` argument.
#>
#> ```
#> ggplot(data = your_data_frame,
#>        mapping = aes(x = your_column_name)) +
#>   geom_histogram()
#> ```
#>
#> 5. Customize the plot as desired using various ggplot2 functions, such as
#> `labs()` for axis labels and titles, `theme()` for plot themes, and
#> `scale_x_continuous()` and `scale_y_continuous()` for adjusting the axis
#> limits and tick marks.
#>
#> ```
#> ggplot(data = your_data_frame,
#>        mapping = aes(x = your_column_name)) +
#>   geom_histogram() +
#>   labs(x = "X axis label",
#>        y = "Y axis label",
#>        title = "Histogram Title") +
#>   theme_bw() +
#>   scale_x_continuous(limits = c(0, 100),
#>                      breaks = seq(0, 100, 10),
#>                      expand = c(0, 0)) +
#>   scale_y_continuous(limits = c(0, 20),
#>                      breaks = seq(0, 20, 2),
#>                      expand = c(0, 0))
#> ```
#>
#> 6. Use the `ggsave()` function to save the plot to a file.
#>
#> ```
#> ggsave(file = "path/to/save/file.png", width = 6, height = 4, dpi = 300)
#> ```
#>
#> Note: Replace `your_data_frame` and `your_column_name` with the actual names
#> of your data frame and column, respectively. Adjust the axis limits and tick
#> marks according to your needs.
````

To make setting things up as easy as possible, the above lines will prompt you to log into your OpenAI account and generate an API key that is automatically saved for the future once entered into RStudio. The chat also remembers the previous conversation, so you can always ask it to elaborate or explain something differently:

````r
askgpt("I don't understand what you mean. Can you explain this for beginners?")
#> Sure, let's break down the steps to create a histogram using ggplot2 in R for
#> beginners:
#>
#> 1. First, you need to load the ggplot2 package in your R session. You can do
#> this by running the following command:
#>
#> ```
#> library(ggplot2)
#> ```
#>
#> 2. Next, you need to import or create a data frame that contains the data you
#> want to plot. For example, let's create a vector of random numbers using the
#> `rnorm()` function:
#>
#> ```
#> my_data <- rnorm(1000, mean = 10, sd = 2)
#> ```
#>
#> This creates a vector of 1000 normal-distributed random numbers with a mean
#> of 10 and a standard deviation of 2.
#>
#> 3. Now, you can create a basic ggplot object using the `ggplot()` function,
#> passing in the name of the data frame as an argument:
#>
#> ```
#> my_plot <- ggplot(data.frame(my_data), aes(x = my_data))
#> ```
#>
#> 4. Then, you can add a histogram layer on top of the ggplot object using the
#> `geom_histogram()` function:
#>
#> ```
#> my_plot <- my_plot + geom_histogram(binwidth = 0.5)
#> ```
#>
#> Here, `binwidth` is an argument that specifies the size of each histogram bin
#> - we've set it to 0.5, but you can adjust it depending on your data.
#>
#> 5. Finally, you can customize the histogram by adding axis labels, a title,
#> changing the bin width, etc. Here's an example:
#>
#> ```
#> my_plot <- my_plot +
#>   geom_histogram(binwidth = 0.5, fill = "blue",
#>                  alpha = 0.5, colour = "black") +
#>   labs(title = "My Histogram", x = "Values", y = "Frequency")
#> ```
#>
#> In this example, we've added axis labels and a title to the plot, set the bin
#> width to 0.5, filled the histogram bars with blue and made them
#> semi-transparent with `alpha`, and set the border color to black.
#>
#> 6. You can then display the plot using `print(my_plot)` or simply `my_plot`.
#>
#> ```
#> print(my_plot)
#> ```
#>
#> 7. If you want to save the plot to your computer, you can use the `ggsave()`
#> function:
#>
#> ```
#> ggsave("my_histogram.png", my_plot)
#> ```
#>
#> Here, we're saving the histogram as a PNG file named "my_histogram.png". You
#> can specify the path where you want to save the file, and adjust the height
#> and width using the `width` and `height` arguments.
#>
#> Hope this helps!
````

Error Explanation

One thing that endlessly frustrated me when I first learned R was the sometimes rather cryptic error messages. askgpt solves this problem by logging your errors and sending them to the OpenAI API when prompted. This works by default for tidyverse errors, but you need to enable logging first for other error messages:

````r
log_init()
mean[1]
askgpt("What is wrong with my last command?")
#> The error message "object of type 'closure' is not subsettable" occurs when
#> you try to subset or extract a portion of an R function or a closure object,
#> which is not allowed.
#>
#> For example, if you try to index a function by using the `[ ]` operator, the
#> error message will appear. This is because functions are not indexable or
#> subsettable objects in R.
#>
#> Here's an example code that produces this error:
#>
#> ```r
#> # defining a function
#> myFun <- function(x) {
#>   x^2
#> }
#>
#> # trying to subset the function with index
#> myFun[1:3]
#> ```
#>
#> When you run this code, you'll get the error message:
#>
#> ```
#> Error in myFun[1:3] : object of type 'closure' is not subsettable
#> ```
#>
#> The error message is telling you that you can't subset the `myFun` function
#> since it is not a data object with indexable elements.
#>
#> To fix this error, you need to make sure that you are not trying to subset or
#> extract a portion of a function or closure object. Instead, you should use
#> the function or closure as it was intended to be used. If you want to extract
#> some value or output from the function, you can assign it to a variable or
#> use it as an argument in another function call.
````

"What is wrong with my last command?" in this case is a special trigger that sends your last error message and the code that produced it. "help!" is a short alias and does the same thing.

Addins for Teaching

The package also comes with several RStudio addins that cover some common tasks for learning or teaching R and for developing packages. The biggest one is the Tutorialise addin. Let's say you have the code for a tutorial ready and a general plan on how to proceed. Now the final step is to make this into a class with explanations for the code and some examples. Highlight the code and select Tutorialise Code from the Addins menu.

Other Addins

At the moment, there are four more addins, two targeted at people learning R and two for R developers:

- Explain Code sends the highlighted code to the API and returns the answer in the Console
- Annotate Code adds comments to the highlighted code directly in the R script
- Document Code documents functions using roxygen2 syntax
- Write Test creates a testthat-style unit test for a highlighted function

Configuration

You can configure how askgpt sends API requests by using options that start with askgpt_*. For example, to use a different model in askgpt(), use options(askgpt_chat_model = "gpt-3.5-turbo-0301"). If you use the completions instead of the chat API (chat = FALSE in askgpt()), use options(askgpt_completions_model = "text-curie-001"). It does not matter if the API parameter is listed in the function or not; all are used. See the complete list here and here. The most important setting, however, is askgpt_config. This can be used to configure the chat using plain English:

```r
options(askgpt_config = "I'm 8 years old, please explain things easily")
askgpt("What is an R function?")
#>
#> ── Answer ──────────────────────────────────────────────────────────────────────
#> An R function is like giving your friend a set of instructions to perform a
#> particular task. In R programming, a function is a set of instructions or steps
#> that is given a name, and when you call that name, the function will perform
#> those instructions. A function can take information or inputs, do something
#> with those inputs (like adding or subtracting), and then give the result back
#> as output.
#>
#> For example, think about giving your friend the instructions to make a peanut
#> butter sandwich. The instructions might be:
#>
#> 1. Take two slices of bread
#> 2. Spread peanut butter on one slice
#> 3. Spread jelly on the other slice
#> 4. Put the two slices together
#>
#> In R, a function might take a number (like 5) and add 1 to it, and then return
#> the result (which would be 6).
#>
#> Functions in R are used to make code easier to use, understand, and reuse. They
#> can also help programmers write complex and efficient programs.
```

Technical Details on the Conversation History

One more rather technical detail about the package: the conversation history is not kept by OpenAI as context (they are definitely storing your requests somewhere, but it is not used inside the conversation). Rather, the questions and answers are stored in the R environment. You can access them using the functions prompt_history() and response_history():

````r
prompt_history()
#> [1] "Can you explain how functions work in R?"
#> [2] "How do you make a histogram with ggplot2?"
#> [3] "I don't understand what you mean. Can you explain this for beginners?"
#> [4] "explain why this R code does not work:\nNULL\n\"object of type 'closure' is not subsettable\""
response_history()
#> [1] "Yes, of course! \n\nFunctions in R are like self-contained units of code that perform a specific task. They are used to create reusable code to avoid writing the same task again and again. In R, we use pre-defined inbuilt functions or we create our own functions as per our requirement. \n\nHere's how a simple function works in R:\n\n```r\n# Creating a function:\nmy_function
````


We Programmed ChatGPT Into This Article. It's Weird.

The Atlantic - Technology

ChatGPT, the internet-famous AI text generator, has taken on a new form. Once a website you could visit, it is now a service that you can integrate into software of all kinds, from spreadsheet programs to delivery apps to magazine websites such as this one. Snapchat added ChatGPT to its chat service (it suggested that users might type "Can you write me a haiku about my cheese-obsessed friend Lukas?"), and Instacart plans to add a recipe robot. They will be weirder than you might think. Instead of one big AI chat app that delivers knowledge or cheese poetry, the ChatGPT service (and others like it) will become an AI confetti bomb that sticks to everything.


With ChatGPT, the new Bing wants to be your 'AI-powered copilot for the web'

PCWorld

Microsoft unveiled the "age of AI" on Tuesday, with new conversational ChatGPT features for Bing described as the "AI-powered copilot for the web." Microsoft will add contextual searches to Bing, powered by its own version of the ChatGPT algorithm. Microsoft will also integrate a separate, dedicated chat interface for Bing, complete with footnoted links. Finally, AI will be integrated into Edge, allowing it to summarize a financial earnings report, for example. Bing's new search engine interface is live, but only for a limited number of people.